10 results for CHD Prediction, Blood Serum Data Chemometrics Methods

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

100.00%

Publisher:

Abstract:

Fluoroacetate is a highly toxic species found naturally in plants and in a commercial product (compound 1080) used for population control of several undesirable animal species. However, it is non-selective and toxic to many other animals, including humans, so its detection is important for forensic purposes. This paper presents a sensitive and fast method for the determination of fluoroacetate in blood serum using capillary electrophoresis with capacitively coupled contactless conductivity detection. Blood serum samples were treated with ethanol to remove proteins and analyzed in a background electrolyte (BGE) containing 15 mmol/L histidine and 30 mmol/L gluconic acid (pH 3.85). The calibration curve was linear up to 75 µmol/L (R² = 0.9995 for N = 12). The detection limit in blood serum was 0.15 mg/kg, which is below the lethal dose for humans and other animals. Fluoride, a metabolite of fluoroacetate defluorination, could also be detected at levels above 20 µmol/L when polybrene was used to reverse the electroosmotic flow (EOF). CTAB and didecyldimethylammonium bromide are not useful for this task because they severely reduce the fluoride signal; no interference was observed for fluoroacetate, however.
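The linear-calibration and detection-limit figures quoted above follow from a standard least-squares treatment. The sketch below uses made-up response values and the common 3.3·s/slope convention for the detection limit; it is an illustration of the procedure, not the paper's data.

```python
import numpy as np

# Hypothetical calibration points (concentration in µmol/L vs. detector
# response); the numbers are illustrative, not the paper's measurements.
conc = np.array([5.0, 15.0, 30.0, 45.0, 60.0, 75.0])
signal = np.array([0.52, 1.49, 3.02, 4.48, 6.01, 7.51])

# Ordinary least-squares line: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc, signal, 1)

# Coefficient of determination R^2
pred = slope * conc + intercept
ss_res = np.sum((signal - pred) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# Detection limit from the residual standard deviation (3.3*s/slope convention)
s_res = np.sqrt(ss_res / (len(conc) - 2))
lod = 3.3 * s_res / slope
```

A quoted R² close to 1 only says the response is linear over the tested range; the detection limit additionally depends on the scatter of the residuals.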

Abstract:

The effects of ATP, ADP, and adenosine on platelet aggregation, vasodilatation, and coronary flow have been known for many years. The sequential hydrolysis of ATP to adenosine by soluble nucleotidases constitutes the main system for rapid inactivation of circulating adenine nucleotides. Thyroid disorders affect a number of biological factors, including adenosine levels in different fractions. We therefore investigated whether the soluble nucleotidases responsible for ATP, ADP, and AMP hydrolysis are affected by variations in thyroid hormone levels in blood serum from adult rats. Hyperthyroidism was induced by daily intraperitoneal injections of L-thyroxine (T4) (2.5 and 10.0 µg/100 g body weight, respectively) for 7 or 14 days. Hypothyroidism was induced by thyroidectomy, with methimazole (0.05%) added to the drinking water for 7 or 14 days. The efficacy of the treatments was confirmed by determination of hemodynamic parameters and evaluation of cardiac hypertrophy. T4 treatment predominantly inhibited, and hypothyroidism (14 days after thyroidectomy) predominantly increased, ATP, ADP, and AMP hydrolysis in rat blood serum. These results suggest that both excess and deficiency of thyroid hormones can modulate ATP diphosphohydrolase and 5'-nucleotidase activities in rat blood serum and consequently modulate the effects mediated by these enzymes and their products in the vascular system. (C) 2010 International Union of Biochemistry and Molecular Biology, Inc.

Abstract:

Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes that combine both techniques simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for the analysis of data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms a system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images we call tomograms. The association of tomograms (images) with eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this property is fundamental for their handling and interpretation. When the data cube contains objects presenting uncorrelated physical phenomena, the orthogonality of the eigenvectors may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, remove noise, compress data, extract spectra, and so on. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus not known before. Furthermore, we show that it is displaced from the centre of its stellar bulge.
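The decomposition described above can be sketched in a few lines: reshape the cube into a spaxel-by-wavelength matrix, take an SVD to obtain mutually orthogonal eigenvectors (eigenspectra) ordered by decreasing variance, and project the data onto them to form the tomograms. The small synthetic cube below, with a single emission line, is an assumption for illustration only, not the NGC 4736 data.

```python
import numpy as np

# Synthetic data cube: two spatial axes (ny, nx) and one spectral axis (nl)
ny, nx, nl = 8, 8, 50
rng = np.random.default_rng(0)
spectrum = np.exp(-0.5 * ((np.arange(nl) - 25) / 3.0) ** 2)  # one emission line
weights = rng.random((ny, nx))                               # spatial pattern
cube = (weights[:, :, None] * spectrum[None, None, :]
        + 0.01 * rng.standard_normal((ny, nx, nl)))          # add weak noise

# Reshape to (n_spaxels, n_wavelengths) and subtract the mean spectrum
X = cube.reshape(ny * nx, nl)
Xc = X - X.mean(axis=0)

# SVD: rows of Vt are the mutually orthogonal eigenvectors (eigenspectra),
# ordered by decreasing variance
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
eigenspectra = Vt
tomograms = (Xc @ Vt.T).reshape(ny, nx, nl)  # tomograms[:, :, k] is image of PC k

# Fraction of the total variance carried by the first principal component
var_frac = S[0] ** 2 / np.sum(S ** 2)
```

Because the synthetic cube contains a single correlated phenomenon plus weak noise, almost all of the variance lands in the first eigenvector, which is exactly the behaviour the method exploits for compression and noise removal.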

Abstract:

Background and aims: Toxoplasmic retinochoroiditis may recur months or years after the primary infection. Rupture of dormant cysts in the retina is the accepted hypothesis to explain recurrence. Here, the authors present evidence supporting the presence of Toxoplasma gondii in the peripheral blood of immunocompetent patients. Methods: Direct observation by light microscopy and by immunofluorescence assay was performed, and the results were confirmed by PCR amplification of parasite DNA. Results: The authors studied 20 patients from Erechim, Brazil, including acutely infected patients, patients with recurrent active toxoplasmic retinochoroiditis, patients with old toxoplasmic retinal scars, and patients with circulating IgG antibodies against T. gondii and no ocular lesions. Blood samples were analysed, and T. gondii was found in the blood of acutely and chronically infected patients regardless of the presence of toxoplasmic retinochoroiditis. Conclusions: The results indicate that the parasite may circulate in the blood of immunocompetent individuals and that parasitaemia could be associated with reactivation of the ocular disease.

Abstract:

In this paper, a new parametric method to deal with discrepant experimental results is developed. The method is based on the fit of a probability density function to the data. This paper also compares the characteristics of different methods used to deduce recommended values and uncertainties from a discrepant set of experimental data. The methods are applied to the published ¹³⁷Cs and ⁹⁰Sr half-lives, and special emphasis is given to the deduced confidence intervals. The obtained results are analyzed considering two fundamental properties expected from an experimental result: the probability content of confidence intervals and the statistical consistency between different recommended values. The recommended values and uncertainties for the ¹³⁷Cs and ⁹⁰Sr half-lives are 10,984 (24) days and 10,523 (70) days, respectively. (C) 2009 Elsevier B.V. All rights reserved.
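One of the conventional procedures such a paper compares against is the weighted mean with a Birge-ratio inflation of the uncertainty when the data set is discrepant. The sketch below illustrates that baseline treatment with made-up half-life values; it is not the paper's new parametric method and not the evaluated data.

```python
import numpy as np

# Illustrative discrepant measurements of one half-life (days) with quoted
# one-sigma uncertainties; the numbers are invented for this example.
values = np.array([10970.0, 11000.0, 10940.0, 11020.0])
sigmas = np.array([15.0, 20.0, 25.0, 30.0])

w = 1.0 / sigmas ** 2
mean = np.sum(w * values) / np.sum(w)   # weighted mean

# Internal uncertainty (from the quoted errors alone)
u_int = 1.0 / np.sqrt(np.sum(w))

# Birge ratio: reduced chi-square of the set about the weighted mean
chi2 = np.sum(w * (values - mean) ** 2)
birge = np.sqrt(chi2 / (len(values) - 1))

# Inflate the uncertainty when the set is discrepant (Birge ratio > 1)
u_rec = u_int * max(1.0, birge)
```

The probability content of the resulting confidence interval is exactly the property the paper scrutinizes: for a genuinely discrepant set, the uninflated internal uncertainty understates the true spread.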

Abstract:

The dynamical processes that lead to open cluster disruption cause its mass to decrease. To investigate such processes from the observational point of view, it is important to identify open cluster remnants (OCRs), which are intrinsically poorly populated. Owing to their nature, distinguishing them from field-star fluctuations is still an unresolved issue. In this work, we develop a statistical diagnostic tool to distinguish poorly populated star concentrations from background field fluctuations. We use 2MASS photometry to explore one of the conditions required for a stellar group to be a physical group: producing distinct sequences in a colour-magnitude diagram (CMD). We use automated tools to (i) derive the limiting radius; (ii) decontaminate the field and assign membership probabilities; (iii) fit isochrones; and (iv) compare object and field CMDs, considering the isochrone solution, in order to verify their similarity. If the object cannot be statistically considered a field fluctuation, we derive its probable age, distance modulus, reddening and their uncertainties in a self-consistent way. As a test, we apply the tool to open clusters and comparison fields. Finally, we study the OCR candidates DoDz 6, NGC 272, ESO 435 SC48 and ESO 325 SC15. The tool is optimized to treat these low-statistics objects and to select the best OCR candidates for studies of kinematics and chemical composition. The study of possible OCRs will certainly provide a deeper understanding of OCR properties and constraints for theoretical models, including insights into the evolution of open clusters and their dissolution rates.
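Step (ii) above, field decontamination with membership probabilities, is commonly implemented by comparing star counts in colour-magnitude cells of the object region with those of an equal-area comparison field. The sketch below is a generic version of that idea with synthetic photometry; the cell scheme and probability formula are assumptions for illustration, not necessarily the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Object region: a thin "cluster sequence" (60 stars) plus field contamination
seq_mag = rng.uniform(10.0, 16.0, 60)
seq_col = 0.1 * (seq_mag - 10.0) + rng.normal(0.0, 0.03, 60)
fld_mag = rng.uniform(10.0, 16.0, 80)
fld_col = rng.uniform(-0.2, 1.0, 80)
obj = np.column_stack([np.r_[seq_mag, fld_mag], np.r_[seq_col, fld_col]])

# Equal-area comparison field: contamination only
field = np.column_stack([rng.uniform(10.0, 16.0, 80),
                         rng.uniform(-0.2, 1.0, 80)])

# Count stars in colour-magnitude cells for both regions
mag_edges = np.linspace(10.0, 16.0, 7)
col_edges = np.linspace(-0.2, 1.0, 7)
n_obj, _, _ = np.histogram2d(obj[:, 0], obj[:, 1], bins=[mag_edges, col_edges])
n_fld, _, _ = np.histogram2d(field[:, 0], field[:, 1], bins=[mag_edges, col_edges])

# Cell-by-cell membership probability (excess over the field), clipped to [0, 1]
with np.errstate(divide="ignore", invalid="ignore"):
    p_member = np.clip((n_obj - n_fld) / n_obj, 0.0, 1.0)
p_member = np.nan_to_num(p_member)

# Estimated number of members after decontamination
n_members = np.sum(p_member * n_obj)
```

Because the cluster sequence concentrates its stars in a few cells while the field spreads uniformly, the excess counts isolate the sequence; for a pure field fluctuation, the cellwise excess would be consistent with zero.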

Abstract:

Evidence of jet precession in many galactic and extragalactic sources has been reported in the literature. Much of this evidence is based on studies of the kinematics of the jet knots, which depend on the correct identification of the components in order to determine their respective proper motions and position angles on the plane of the sky. Identification problems related to fitting procedures, as well as observations poorly sampled in time, may compromise the follow-up of the components over time and consequently contribute to a misinterpretation of the data. To deal with these limitations, we introduce a very powerful statistical tool for analysing jet precession: the cross-entropy method for continuous multi-extremal optimization. Based only on the raw data of the jet components (right ascension and declination offsets from the core), the cross-entropy method searches for the precession-model parameters that best represent the data. In this work we present a large number of tests to validate the technique, using synthetic precessing jets built from a given set of precession parameters. With the aim of recovering these parameters, we applied the cross-entropy method to our precession model, exhaustively varying the quantities associated with the method. Our results show that, even in the most challenging tests, the cross-entropy method was able to find the correct parameters to within the 1 per cent level. Even for a non-precessing jet, our optimization method successfully indicated the absence of precession.
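The core loop of the cross-entropy method for continuous optimization is short: sample candidate parameter sets from a Gaussian, keep an elite fraction with the lowest misfit, and refit the Gaussian to the elite. The sketch below recovers two parameters of a simple quadratic misfit that stands in for the precession-model residual; the test function and parameter values are assumptions for illustration, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(42)
true_params = np.array([1.5, -0.7])  # hypothetical "correct" parameters

def misfit(p):
    # Sum of squared residuals; a stand-in for the distance between the
    # precession model and the observed component offsets
    return np.sum((p - true_params) ** 2)

mu = np.zeros(2)          # initial mean of the sampling distribution
sigma = np.ones(2) * 2.0  # initial spread (wide enough to cover the optimum)
n_samples, n_elite = 100, 10

for _ in range(40):
    # Draw candidates, score them, keep the elite, refit the Gaussian
    candidates = rng.normal(mu, sigma, size=(n_samples, 2))
    scores = np.array([misfit(c) for c in candidates])
    elite = candidates[np.argsort(scores)[:n_elite]]
    mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-8
```

The mean converges to the minimizer while the spread collapses, which is how the method both locates the parameters and signals convergence; in a multi-extremal landscape the wide initial spread is what lets it escape local minima.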

Abstract:

We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources, each characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. These images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with an accuracy similar to that obtained with the traditional Astronomical Image Processing System (AIPS) task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to determine quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique should be used in situations involving the analysis of complex emission regions with more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As in the case of any model fitting performed in the image plane, caution is required when analyzing images constructed from a poorly sampled (u, v) plane.
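The performance function described above can be sketched directly: build the model image as a sum of elliptical Gaussian components and score it by the summed squared difference against the observed image. The parameterization below (eccentricity defining the minor axis, widths in pixels) and all numeric values are assumptions for illustration, not the authors' exact convention.

```python
import numpy as np

def elliptical_gaussian(x, y, x0, y0, peak, bmaj, ecc, theta):
    # One component: peak position (x0, y0), peak intensity, major-axis width,
    # eccentricity (minor axis = bmaj * sqrt(1 - ecc^2)) and orientation angle
    bmin = bmaj * np.sqrt(1.0 - ecc ** 2)
    xr = (x - x0) * np.cos(theta) + (y - y0) * np.sin(theta)
    yr = -(x - x0) * np.sin(theta) + (y - y0) * np.cos(theta)
    return peak * np.exp(-0.5 * ((xr / bmaj) ** 2 + (yr / bmin) ** 2))

def model_image(params, shape):
    # Sum the N_s components, each described by its six parameters
    y, x = np.mgrid[0:shape[0], 0:shape[1]]
    img = np.zeros(shape)
    for p in params:
        img += elliptical_gaussian(x, y, *p)
    return img

def performance(params, observed):
    # Quantity the fit minimizes: sum of squared residuals over all pixels
    return np.sum((model_image(params, observed.shape) - observed) ** 2)

# A two-component "benchmark jet"; compared with itself it scores exactly zero
truth = [(20.0, 16.0, 1.0, 3.0, 0.6, 0.3),
         (40.0, 24.0, 0.5, 5.0, 0.8, 1.0)]
observed = model_image(truth, (48, 64))
```

An optimizer such as the cross-entropy method then searches the 6·N_s-dimensional parameter space for the set minimizing `performance`; any displacement of a component away from its true position raises the score.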

Abstract:

We analyzed ostriches from an equipped farm in southeastern Brazil for the presence of Salmonella spp. The bacterium was investigated in 80 samples of ostrich droppings, 90 eggs, 30 feed samples and 30 samples of rodent droppings. Additionally, at the slaughterhouse, the bacterium was investigated in the droppings, caecal contents, spleen, liver and carcasses of 90 slaughtered ostriches from the studied farm. Blood serum from these animals was also harvested and submitted to serum plate agglutination using a commercial Salmonella Pullorum antigen. No Salmonella spp. was detected in any of the eggs, caecal contents, liver, spleen, carcass or dropping samples from ostriches and rodents. However, Salmonella Javiana and Salmonella enterica subsp. enterica 4,12:i:- were isolated from some feed samples. The serologic test was negative for all samples. Good sanitary management on the farm and the application of HACCP principles and GMP during the slaughtering process could explain the absence of Salmonella spp. in the tested samples.

Abstract:

Low-density lipoprotein (LDL), often known as "bad cholesterol", is one of the factors responsible for increasing the risk of coronary arterial disease. For this reason, the cholesterol carried in LDL particles has become one of the main parameters quantified in routine clinical diagnosis. A number of tools are available to assess LDL particles and estimate their cholesterol concentration in blood. The most common methods to quantify LDL in plasma are density-gradient ultracentrifugation and nuclear magnetic resonance (NMR). However, these techniques require special equipment and can take a long time to provide results. In this paper, we report on the increase of europium emission in europium-oxytetracycline complex aqueous solutions in the presence of LDL. This increase is proportional to the LDL concentration in the solution, a phenomenon that can be used to develop a method to quantify the number of LDL particles in a sample. A comparison between the performance of oxytetracycline and tetracycline in the complexes is also made.